BFGS method : Wikipedia English edition
Broyden–Fletcher–Goldfarb–Shanno algorithm
In numerical optimization, the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm is an iterative method for solving unconstrained nonlinear optimization problems.
The BFGS method belongs to quasi-Newton methods, a class of hill-climbing optimization techniques that seek a stationary point of a (preferably twice continuously differentiable) function. For such problems, a necessary condition for optimality is that the gradient be zero.
Newton's method and the BFGS method are not guaranteed to converge unless the function has a quadratic Taylor expansion near an optimum. Newton's method uses both the first and second derivatives of the function, whereas BFGS requires only the first derivatives and builds up an approximation to the second. However, BFGS has proven to have good performance even for non-smooth optimizations.
In quasi-Newton methods, the Hessian matrix of second derivatives does not need to be evaluated directly. Instead, the Hessian matrix is approximated using updates built from rank-one terms specified by gradient evaluations (or approximate gradient evaluations). Quasi-Newton methods are generalizations of the secant method to find the root of the first derivative for multidimensional problems. In multiple dimensions the secant equation does not determine a unique solution, and quasi-Newton methods differ in how they constrain the solution. The BFGS method is one of the most popular members of this class.〔, page 24〕 Also in common use is L-BFGS, a limited-memory version of BFGS that is particularly suited to problems with very large numbers of variables (e.g., >1000). The L-BFGS-B variant handles simple box constraints.
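As a usage sketch (assuming SciPy is available), the reference BFGS implementation in `scipy.optimize.minimize` can be applied to the standard Rosenbrock test function, supplying the analytic gradient so no Hessian is ever formed:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock function with BFGS from a standard starting point.
# Only the gradient (jac) is supplied; the Hessian is approximated internally.
x0 = np.array([-1.2, 1.0])
res = minimize(rosen, x0, jac=rosen_der, method="BFGS")
print(res.x)   # converges to the minimizer near [1, 1]
```

When the gradient is expensive or unavailable, `jac` can be omitted and SciPy falls back to finite-difference approximations of it.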
== Rationale ==

The search direction \mathbf{p}_k at stage k is given by the solution of the analogue of the Newton equation
: B_k \mathbf{p}_k = -\nabla f(\mathbf{x}_k)
where B_k is an approximation to the Hessian matrix, updated iteratively at each stage, and \nabla f(\mathbf{x}_k) is the gradient of the function evaluated at \mathbf{x}_k. A line search in the direction \mathbf{p}_k is then used to find the next point \mathbf{x}_{k+1}. Instead of computing the full Hessian at the point \mathbf{x}_{k+1} as B_{k+1}, the approximate Hessian at stage k is updated by the addition of two matrices.
:B_{k+1} = B_k + U_k + V_k
Both U_k and V_k are symmetric rank-one matrices, but they have different (matrix) bases. The symmetric rank-one assumption here means that we may write
:C = \mathbf{v}\mathbf{v}^{\mathrm{T}}
So, equivalently, U_k and V_k together construct a rank-two update matrix, which is robust against the scale problems that often afflict gradient-descent searches (e.g., in Broyden's method).
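The rank structure can be checked numerically (a small illustration with arbitrary vectors, not from the source): any outer product \mathbf{v}\mathbf{v}^{\mathrm{T}} is symmetric with rank one, and the sum of two such terms with independent vectors has rank two.

```python
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(5)
C = np.outer(v, v)                  # symmetric rank-one matrix C = v v^T
print(np.allclose(C, C.T))          # symmetric: True
print(np.linalg.matrix_rank(C))     # rank one: 1

# Sum of two independent rank-one terms, as in B_{k+1} = B_k + U_k + V_k:
w = rng.standard_normal(5)
print(np.linalg.matrix_rank(np.outer(v, v) + np.outer(w, w)))   # rank two: 2
```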
The quasi-Newton condition imposed on this update is
:B_{k+1} (\mathbf{x}_{k+1} - \mathbf{x}_k) = \nabla f(\mathbf{x}_{k+1}) - \nabla f(\mathbf{x}_k).
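Putting the pieces together, one iteration — solving B_k \mathbf{p}_k = -\nabla f(\mathbf{x}_k) for the search direction, a line search, and a rank-two update of B_k that satisfies the quasi-Newton condition — can be sketched in NumPy. The quadratic objective and the backtracking (Armijo) line search are illustrative assumptions; the update formula shown is the standard BFGS choice of U_k and V_k:

```python
import numpy as np

def f(x):                         # illustrative objective, not from the source
    return x[0]**2 + 5.0*x[1]**2

def grad(x):
    return np.array([2.0*x[0], 10.0*x[1]])

x = np.array([1.0, 1.0])
B = np.eye(2)                               # initial Hessian approximation B_0

# Search direction from the Newton-equation analogue: B_k p_k = -grad f(x_k)
p = np.linalg.solve(B, -grad(x))

# Backtracking line search with the Armijo sufficient-decrease condition.
alpha, c, rho = 1.0, 1e-4, 0.5
while f(x + alpha*p) > f(x) + c*alpha*(grad(x) @ p):
    alpha *= rho

x_new = x + alpha*p
s = x_new - x                               # step s_k = x_{k+1} - x_k
y = grad(x_new) - grad(x)                   # y_k = grad f(x_{k+1}) - grad f(x_k)

# Rank-two update B_{k+1} = B_k + U_k + V_k (standard BFGS terms).
U = np.outer(y, y) / (y @ s)
V = -np.outer(B @ s, B @ s) / (s @ (B @ s))
B_new = B + U + V

# The quasi-Newton condition B_{k+1} s_k = y_k holds by construction.
print(np.allclose(B_new @ s, y))            # True
```

Note that B_{k+1} remains positive definite only when the curvature condition y_k^{\mathrm{T}} s_k > 0 holds, which a suitable line search (e.g., one enforcing the Wolfe conditions) guarantees.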

Excerpt source: Wikipedia, the free encyclopedia.
Read the full article "Broyden–Fletcher–Goldfarb–Shanno algorithm" on Wikipedia.




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.